Meta’s Move: Protecting Teens from Harmful Content on Social Media

Meta, the company behind Instagram and Facebook, announced on Tuesday that it will take steps to hide inappropriate content related to suicide, self-harm, and eating disorders from the accounts of teenagers. This decision comes as part of their efforts to create a safer online environment for young users.

In a blog post, Meta stated that even if a teen follows an account that shares such content, it won't appear in their feed. Additionally, teen accounts — provided the user gave accurate age information during sign-up — will default to the most restrictive settings, which also prevent them from searching for potentially harmful terms.

Meta emphasized its commitment to providing age-appropriate experiences for teens on its platforms. The move is widely seen as a response to ongoing lawsuits from numerous U.S. states accusing Meta of contributing to young people's mental health challenges through features that allegedly promote addiction.

Despite the announcement, critics argue that Meta's actions are insufficient. Josh Golin, executive director of the children's online advocacy group Fairplay, criticized the timing of the changes, questioning why Meta waited until 2024 to address these concerns.
